Inequalities between the Jensen-Shannon and Jeffreys divergences

Author

  • Gavin E. Crooks
Abstract

The last line follows from the previous line by a second application of the same Jensen inequality. Since the J-divergence ranges between zero and positive infinity, whereas the Jensen-Shannon divergence ranges between zero and ln 2 [i.e. 1 bit], this inequality has the correct limits for identical (p_i = q_i, JS(p; q) = Jeffreys(p; q) = 0) and orthogonal (p_i q_i = 0, JS(p; q) = ln 2, Jeffreys(p; q) = +∞) distributions.
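The limits quoted above are easy to verify numerically. The following is a minimal sketch, assuming the standard definitions of both divergences in nats; the helper names are illustrative and not taken from the note.

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) in nats; terms with p_i = 0 contribute zero.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jeffreys(p, q):
    # Jeffreys (J-) divergence: the symmetrized KL divergence.
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):
    # Jensen-Shannon divergence: mean KL divergence to the midpoint mixture.
    m = (p + q) / 2
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([1.0, 0.0])
q = np.array([0.0, 1.0])
print(jensen_shannon(p, p))             # identical distributions: 0
print(jensen_shannon(p, q), np.log(2))  # orthogonal distributions: ln 2
# jeffreys(p, q) diverges for orthogonal supports, matching the +infinity limit.
```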


Similar articles

A family of statistical symmetric divergences based on Jensen's inequality

We introduce a novel parametric family of symmetric information-theoretic distances based on Jensen’s inequality for a convex functional generator. In particular, this family unifies the celebrated Jeffreys divergence with the Jensen-Shannon divergence when the Shannon entropy generator is chosen. We then design a generic algorithm to compute the unique centroid defined as the minimum average d...
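The excerpt does not reproduce the family's formula, but the Jensen construction it refers to can be sketched as follows, assuming the plain (unskewed) midpoint form: for a convex generator F, the Jensen gap at the midpoint is a nonnegative symmetric divergence, and choosing the negative Shannon entropy as generator recovers the Jensen-Shannon divergence.

```python
import numpy as np

def neg_entropy(p):
    # Convex generator F(p) = sum_i p_i log p_i (negative Shannon entropy).
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask])))

def jensen_divergence(F, p, q):
    # Jensen divergence for a convex generator F:
    # (F(p) + F(q)) / 2 - F((p + q) / 2), nonnegative by Jensen's inequality.
    return 0.5 * (F(p) + F(q)) - F((p + q) / 2)

p = np.array([0.7, 0.3])
q = np.array([0.2, 0.8])
# With the negative-entropy generator this equals the Jensen-Shannon divergence.
print(jensen_divergence(neg_entropy, p, q))
```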


Generalized Symmetric Divergence Measures and Inequalities

The first measure generalizes the well-known J-divergence due to Jeffreys [16] and Kullback and Leibler [17]. The second gives a unified generalization of the Jensen-Shannon divergence due to Sibson [22] and Burbea and Rao [2, 3], and of the arithmetic-geometric mean divergence due to Taneja [27]. These two measures contain in particular some well-known divergences such as: Hellinger’s discr...
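As a hedged sketch of the three named measures, using their usual textbook forms rather than formulas quoted in this excerpt (strictly positive distributions assumed; the arithmetic-geometric mean divergence follows Taneja's standard definition):

```python
import numpy as np

def j_divergence(p, q):
    # Jeffreys' J-divergence: sum_i (p_i - q_i) * log(p_i / q_i).
    return float(np.sum((p - q) * np.log(p / q)))

def js_divergence(p, q):
    # Jensen-Shannon divergence via the midpoint mixture m = (p + q) / 2.
    m = (p + q) / 2
    return 0.5 * float(np.sum(p * np.log(p / m)) + np.sum(q * np.log(q / m)))

def ag_divergence(p, q):
    # Arithmetic-geometric mean divergence (Taneja's standard form):
    # sum_i ((p_i + q_i) / 2) * log(((p_i + q_i) / 2) / sqrt(p_i * q_i)).
    a = (p + q) / 2
    return float(np.sum(a * np.log(a / np.sqrt(p * q))))

p = np.array([0.6, 0.4])
q = np.array([0.3, 0.7])
print(j_divergence(p, q), js_divergence(p, q), ag_divergence(p, q))
```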


f-Divergences and Related Distances

Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker’s inequality is derived for an arbitra...
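For context, an f-divergence is generated by a convex function f with f(1) = 0. A minimal sketch, instantiated with the chi-squared divergence mentioned above and with the total variation distance that appears in Pinsker's inequality; the function names are illustrative:

```python
import numpy as np

def f_divergence(f, p, q):
    # Generic f-divergence D_f(p || q) = sum_i q_i * f(p_i / q_i)
    # for a convex f with f(1) = 0; assumes q_i > 0 throughout.
    return float(np.sum(q * f(p / q)))

chi_squared = lambda t: (t - 1) ** 2             # chi-squared divergence
total_variation = lambda t: 0.5 * np.abs(t - 1)  # total variation distance

p = np.array([0.6, 0.4])
q = np.array([0.3, 0.7])
print(f_divergence(chi_squared, p, q))
print(f_divergence(total_variation, p, q))
```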


A Note on Bound for Jensen-Shannon Divergence by Jeffreys

We present a lower bound on the Jensen-Shannon divergence by the Jeffreys divergence when p_i ≥ q_i is satisfied. In Lin's original paper [IEEE Trans. Info. Theory, 37, 145 (1991)], where the divergence was introduced, the upper bound in terms of the Jeffreys divergence was a quarter of it. In view of a recent sharper one reported by Crooks, we present a discussion on upper bounds by transcendental fu...
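The quarter bound attributed to Lin above, JS(p; q) ≤ Jeffreys(p; q)/4, can be spot-checked numerically. A small sketch, assuming both divergences in nats and strictly positive random distributions:

```python
import numpy as np

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

def js(p, q):
    m = (p + q) / 2
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jeffreys(p, q):
    return kl(p, q) + kl(q, p)

rng = np.random.default_rng(0)
# Spot-check Lin's bound JS(p; q) <= Jeffreys(p; q) / 4 on random distributions.
for _ in range(10_000):
    p, q = rng.dirichlet(np.ones(4)), rng.dirichlet(np.ones(4))
    assert js(p, q) <= jeffreys(p, q) / 4 + 1e-12
print("JS <= J/4 held on all samples")
```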


Nested Inequalities Among Divergence Measures

In this paper we have considered an inequality involving 11 divergence measures. Out of them, three are logarithmic, such as the Jeffreys-Kullback-Leibler [4] [5] J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [7] arithmetic-geometric mean divergence. The other three are non-logarithmic, such as the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination. Three more ...
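The three non-logarithmic measures can be sketched under their standard definitions (strictly positive distributions assumed; normalization conventions for these measures vary across the literature):

```python
import numpy as np

def hellinger(p, q):
    # Hellinger discrimination: (1/2) * sum_i (sqrt(p_i) - sqrt(q_i))**2.
    return 0.5 * float(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def symmetric_chi2(p, q):
    # Symmetric chi-squared divergence: chi2(p, q) + chi2(q, p)
    # = sum_i (p_i - q_i)**2 * (p_i + q_i) / (p_i * q_i).
    return float(np.sum((p - q) ** 2 * (p + q) / (p * q)))

def triangular(p, q):
    # Triangular discrimination: sum_i (p_i - q_i)**2 / (p_i + q_i).
    return float(np.sum((p - q) ** 2 / (p + q)))

p = np.array([0.6, 0.4])
q = np.array([0.3, 0.7])
print(hellinger(p, q), symmetric_chi2(p, q), triangular(p, q))
```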



Publication date: 2008